
Gini Score under Ties and Case Weights

Brauer, Alexej, Wüthrich, Mario V.

arXiv.org Machine Learning

The Gini score is a popular statistical tool in model validation. The Gini score was originally introduced and used for binary responses Y ∈ {0, 1}, and there are many equivalent formulations of the (binary) Gini score, such as the receiver operating characteristic (ROC) curve and the area under the curve (AUC); see, e.g., [Bamber (1975)], [Hanley-McNeil (1982)] and [Fawcett (2006)]. These different formulations are also equivalent to the Wilcoxon-Mann-Whitney U statistic, see [Hanley-McNeil (1982)], [DeLong et al. (1988)], [Byrne (2016)], as well as to [Somers (1962)]'s D, see [Newson (2002)]. Thus, there are at least five equivalent formulations of the Gini score in a binary context, and there is a broad literature on its behavior, which is well understood. When it comes to general real-valued responses, things become more difficult, and definitions and results on the Gini score are mainly found in the credit risk and actuarial literature. In this stream of literature, the Gini score has been introduced by [Gourieroux-Jasiak (2007)] and [Frees et al. (2011), Frees et al. (2013)]. Furthermore, in the real-valued setting the Gini score is studied in much detail in [Denuit et al. (2019)] and [Denuit-Trufin (2021)]. The Gini score is a statistic that assesses whether a given risk ranking is correct.
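The equivalence the abstract mentions between the AUC and the Wilcoxon-Mann-Whitney U statistic can be made concrete for binary responses: the AUC equals the probability that a randomly chosen positive case outranks a randomly chosen negative case (with ties counting one half), and the binary Gini score then follows as 2·AUC − 1. A minimal NumPy sketch (function name and toy data are illustrative, not from the paper):

```python
import numpy as np

def auc_mann_whitney(y, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs where the positive is ranked
    higher, with ties contributing 1/2."""
    y = np.asarray(y)
    scores = np.asarray(scores)
    pos = scores[y == 1]
    neg = scores[y == 0]
    # all pairwise comparisons between positives and negatives
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    u = greater + 0.5 * ties
    return u / (len(pos) * len(neg))

y = [0, 0, 1, 1]
s = [0.1, 0.4, 0.35, 0.8]
auc = auc_mann_whitney(y, s)   # 3 of 4 pairs correctly ordered -> 0.75
gini = 2 * auc - 1             # binary Gini score (Somers' D) -> 0.5
```

This pairwise-counting form makes the equivalence of the ROC/AUC, U-statistic, and Somers' D formulations immediate, since each is an affine transformation of the same pair-ordering probability.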


On-sensor Printed Machine Learning Classification via Bespoke ADC and Decision Tree Co-Design

Armeniakos, Giorgos, Duarte, Paula L., Pal, Priyanjana, Zervakis, Georgios, Tahoori, Mehdi B., Soudris, Dimitrios

arXiv.org Artificial Intelligence

Printed electronics (PE) technology provides cost-effective hardware with unmatched customization, due to its low non-recurring engineering and fabrication costs. PE exhibits features such as flexibility, stretchability, porosity, and conformality, which make it a prominent candidate for enabling ubiquitous computing. Still, the large feature sizes in PE limit the realization of complex printed circuits, such as machine learning classifiers, especially when processing sensor inputs is necessary, mainly due to the costly analog-to-digital converters (ADCs). To this end, we propose the design of fully customized ADCs and present, for the first time, a co-design framework for generating bespoke Decision Tree classifiers. Our comprehensive evaluation shows that our co-design enables self-powered operation of on-sensor printed classifiers in all benchmark cases.


Improved Financial Forecasting via Quantum Machine Learning

Thakkar, Sohum, Kazdaghli, Skander, Mathur, Natansh, Kerenidis, Iordanis, Ferreira-Martins, André J., Brito, Samurai

arXiv.org Artificial Intelligence

Quantum computing is a rapidly evolving field that promises to revolutionize various domains, and finance is no exception. There is a variety of computationally hard financial problems for which quantum algorithms can potentially offer advantages [24, 16, 39, 6], for example in combinatorial optimization [34, 42], convex optimization [30, 43], Monte Carlo simulations [15, 44, 21], and machine learning [41, 18, 1]. In this work, we explore the potential of quantum machine learning methods for improving forecasting performance in finance, focusing on two use cases within the business of Itaú Unibanco, the largest bank in Latin America. In the first use case, we aim to improve the performance of Random Forest methods for churn prediction. We introduce quantum algorithms for Determinantal Point Process (DPP) sampling [29], and develop a method of DPP sampling to enhance Random Forest models.
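The quantum DPP sampler itself is beyond a short sketch, but the classical idea it builds on, weighting a subset S of training rows by the determinant of a kernel submatrix L_S so that diverse subsets are preferred over redundant ones, can be illustrated with a deterministic greedy stand-in. Everything below (function name, linear kernel, toy data) is an illustrative assumption, not the paper's method; greedy determinant maximization approximates the DPP mode rather than drawing a true random sample:

```python
import numpy as np

def greedy_diverse_subset(X, k):
    """Greedily pick k row indices maximizing det(L_S), where
    L = X X^T is a linear kernel. A DPP assigns subset S a
    probability proportional to det(L_S), so this greedy pick
    is a cheap deterministic proxy for DPP diversity."""
    L = X @ X.T
    selected = []
    remaining = list(range(len(X)))
    for _ in range(k):
        best_i, best_det = None, -1.0
        for i in remaining:
            S = selected + [i]
            # determinant shrinks when the new row is nearly
            # a linear combination of already-selected rows
            d = np.linalg.det(L[np.ix_(S, S)])
            if d > best_det:
                best_i, best_det = i, d
        selected.append(best_i)
        remaining.remove(best_i)
    return selected

# duplicate rows 0 and 1: a diverse pick avoids taking both
X = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
subset = greedy_diverse_subset(X, 2)  # picks one duplicate plus row 2
```

In a bagging context, each tree of the forest would be grown on such a diversity-promoting row subset instead of a uniform bootstrap sample, which is the enhancement the abstract describes.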


How To Implement The Decision Tree Algorithm From Scratch In Python - Machine Learning Mastery

#artificialintelligence

Decision trees are a powerful prediction method and extremely popular. They are popular because the final model is so easy to understand by practitioners and domain experts alike. The final decision tree can explain exactly why a specific prediction was made, making it very attractive for operational use. Decision trees also provide the foundation for more advanced ensemble methods such as bagging, random forests and gradient boosting. In this tutorial, you will discover how to implement the Classification And Regression Tree algorithm from scratch with Python.
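The core of the CART algorithm the tutorial implements is choosing splits by Gini impurity: every candidate feature/value pair partitions the rows, and the pair with the lowest weighted impurity of the resulting groups wins. A minimal sketch of that splitting step (function names and the toy dataset layout, rows with the class label last, are illustrative assumptions in the spirit of the tutorial):

```python
def gini_impurity(groups, classes):
    """Weighted Gini impurity of a candidate split:
    sum over groups of (1 - sum of squared class proportions),
    weighted by group size."""
    n = sum(len(g) for g in groups)
    score = 0.0
    for g in groups:
        if not g:
            continue
        labels = [row[-1] for row in g]
        props = [labels.count(c) / len(g) for c in classes]
        score += (1.0 - sum(p * p for p in props)) * len(g) / n
    return score

def best_split(dataset):
    """Exhaustively try every feature/value pair and keep the
    split with the lowest weighted Gini impurity (CART style)."""
    classes = sorted(set(row[-1] for row in dataset))
    best = {"gini": float("inf")}
    for fi in range(len(dataset[0]) - 1):
        for row in dataset:
            left = [r for r in dataset if r[fi] < row[fi]]
            right = [r for r in dataset if r[fi] >= row[fi]]
            g = gini_impurity((left, right), classes)
            if g < best["gini"]:
                best = {"index": fi, "value": row[fi], "gini": g}
    return best

# perfectly separable toy data: feature value, then class label
data = [[1, 0], [2, 0], [3, 0], [10, 1], [11, 1], [12, 1]]
split = best_split(data)  # splits at value 10 with impurity 0.0
```

Growing the full tree is then a matter of applying `best_split` recursively to each group until a depth or size limit is reached, which is exactly how the tutorial proceeds.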